
    The 125 GeV boson: A composite scalar?

    Assuming that the 125 GeV particle observed at the LHC is a composite scalar responsible for electroweak gauge symmetry breaking, we consider the possibility that the bound state is generated by a non-Abelian gauge theory with dynamically generated gauge boson masses and a specific chiral symmetry breaking dynamics motivated by confinement. The scalar mass is computed with the use of the Bethe-Salpeter equation and its normalization condition, as a function of the SU(N) group and the respective fermionic representation. If the fermions that form the composite state are in the fundamental representation of the SU(N) group, such a light boson can be generated only for one specific number of fermions for each group. For small groups, SU(2) to SU(5), with two fermions in the adjoint representation, we find that it is quite improbable to generate such a light composite scalar.

    Comment: 24 pages, 5 figures, discussion extended, references added; version to appear in Phys. Rev.
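    For orientation, a minimal sketch (ours, not the paper's equations) of the quoted machinery: the scalar mass M_S is the value of P^2 at which the homogeneous Bethe-Salpeter equation admits a solution, and the amplitude is fixed by the normalization condition. The kernel K and the dynamically massive fermion propagator S below are schematic placeholders.

```latex
% Schematic homogeneous Bethe-Salpeter equation for the scalar bound state;
% K and S stand in for the paper's specific kernel and dynamically massive
% fermion propagator (placeholders, not the authors' expressions).
\begin{equation}
  \chi(p,P) = \int \frac{d^4k}{(2\pi)^4}\,
    K(p,k;P)\, S\!\left(k+\tfrac{P}{2}\right)\chi(k,P)\,
    S\!\left(k-\tfrac{P}{2}\right), \qquad P^2 = M_S^2 .
\end{equation}
% One common (ladder-approximation) form of the normalization condition,
% which fixes the overall scale of the amplitude:
\begin{equation}
  2iP^{\mu} = \int \frac{d^4k}{(2\pi)^4}\,
    \mathrm{Tr}\!\left[\bar{\chi}(k,P)\,
      \frac{\partial}{\partial P_{\mu}}
      \Big(S^{-1}\!\big(k+\tfrac{P}{2}\big)\otimes
           S^{-1}\!\big(k-\tfrac{P}{2}\big)\Big)\,
      \chi(k,P)\right].
\end{equation}
```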

    The use of heterocaryons in the maintenance of slime stocks of Neurospora crassa, and a method for the re-isolation of slime from heterocaryons


    The Importance of Statistical Theory in Outlier Detection

    We explore the performance of the outlier-sum statistic (Tibshirani and Hastie, Biostatistics 2007, 8:2--8), a proposed method for identifying genes for which only a subset of a group of samples or patients exhibits differential expression levels. Our discussion focuses on this method as an example of how inattention to standard statistical theory can lead to approaches with serious drawbacks. In contrast to the results presented by those authors, when comparing this method to several variations of the t-test, we find that the proposed method offers little benefit even in the most idealized scenarios, and suffers from a number of limitations, including difficulty of calibration, high false positive rates owing to its asymmetric treatment of groups, poor power or discriminatory ability under many alternatives, and poorly defined application to one-sample settings. Further issues in the Tibshirani and Hastie paper concern the presentation and accuracy of their simulation results; we were unable to reproduce their findings, and we discuss several undesirable and implausible aspects of their results.
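    For concreteness, here is a minimal sketch of the outlier-sum statistic as we read the Tibshirani-Hastie description (robust standardization, then a sum of disease-group values beyond a q75 + IQR cutoff); the one-sided treatment of the disease group makes the asymmetry criticized above visible in code.

```python
import numpy as np

def outlier_sum(x, disease_mask):
    """Outlier-sum statistic for a single gene (a sketch following the
    description in Tibshirani & Hastie, Biostatistics 2007)."""
    # Robust standardization: center by the median, scale by the MAD.
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    xs = (x - med) / mad
    # Outlier cutoff over all samples: 75th percentile plus one IQR.
    q25, q75 = np.percentile(xs, [25, 75])
    cutoff = q75 + (q75 - q25)
    # Sum the standardized disease-group values beyond the cutoff; only
    # high outliers in one group contribute (the asymmetry noted above).
    return np.sum(xs[disease_mask & (xs > cutoff)])

rng = np.random.default_rng(0)
x = rng.normal(size=40)
x[35:] += 3.0                                   # a few high outliers
print(outlier_sum(x, np.arange(40) >= 20))      # disease group = last 20
```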

    Genetic nature of the slime variant of Neurospora crassa


    Some Observations on the Wilcoxon Rank Sum Test

    This manuscript presents some general comments about the Wilcoxon rank sum test. Even the most casual reader will gather that I am not too impressed with the scientific usefulness of the Wilcoxon test. However, the actual motivation is more to illustrate differences between parametric, semiparametric, and nonparametric (distribution-free) inference, and to use this example to illustrate how many misconceptions have been propagated through a focus on (semi)parametric probability models as the basis for evaluating commonly used statistical analysis models. The document itself arose as a teaching tool for courses aimed at graduate students in biostatistics and statistics, with parts originally written for applied biostatistics classes and parts written for a course in mathematical statistics. Hence, some of the material is also meant to provide an illustration of common methods of deriving moments of distributions, etc.
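    A small numerical illustration (ours, not the manuscript's) of the distinction at issue: two samples with equal population means but different shapes, for which a t-test and the Wilcoxon rank-sum test can disagree, because the latter is sensitive to Pr(X < Y) rather than to a difference in means.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Equal means (both 1), different shapes: the lognormal sample has a
# much smaller median, so Pr(X < Y) deviates from 1/2 even though the
# difference in means is zero.
x = rng.normal(loc=1.0, scale=1.0, size=200)
y = rng.lognormal(mean=-0.5, sigma=1.0, size=200)   # E[Y] = exp(0) = 1

t_stat, t_p = stats.ttest_ind(x, y, equal_var=False)  # Welch t-test
w_stat, w_p = stats.ranksums(x, y)                    # Wilcoxon rank-sum

print(f"Welch t-test  p = {t_p:.4f}")   # typically large: means agree
print(f"rank-sum test p = {w_p:.4f}")   # typically small: Pr(X<Y) != 1/2
```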

    Scalable Noise Estimation with Random Unitary Operators

    We describe a scalable stochastic method for the experimental measurement of generalized fidelities characterizing the accuracy of the implementation of a coherent quantum transformation. The method is based on the motion reversal of random unitary operators. In the simplest case our method enables direct estimation of the average gate fidelity. The more general fidelities are characterized by a universal exponential rate of fidelity loss. In all cases the measurable fidelity decrease is directly related to the strength of the noise affecting the implementation, quantified by the trace of the superoperator describing the non-unitary dynamics. While the scalability of our stochastic protocol makes it most relevant in large Hilbert spaces (where quantum process tomography is infeasible), our method should be immediately useful for evaluating the degree of control that is achievable in any prototype quantum processing device. By varying over different experimental arrangements and error-correction strategies, additional information about the noise can be determined.

    Comment: 8 pages; v2: published version (typos corrected; reference added).
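    A toy Monte-Carlo sketch of the motion-reversal idea (our illustration, not the authors' experimental protocol): apply a Haar-random unitary, let a depolarizing channel of assumed strength p stand in for the unknown noise, undo the unitary, and record the survival probability of the initial state, whose average tracks the noise strength.

```python
import numpy as np

def haar_unitary(d, rng):
    """Sample a Haar-random d x d unitary via QR decomposition."""
    z = (rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))  # fix column phases

def motion_reversal_survival(d=4, p=0.05, trials=2000, seed=0):
    """Average survival probability of |0> under U -> noise -> U^dagger,
    averaged over random U; the depolarizing channel is an assumption."""
    rng = np.random.default_rng(seed)
    psi0 = np.zeros(d, dtype=complex)
    psi0[0] = 1.0
    rho0 = np.outer(psi0, psi0.conj())
    surv = []
    for _ in range(trials):
        u = haar_unitary(d, rng)
        rho = u @ rho0 @ u.conj().T                 # forward evolution
        rho = (1 - p) * rho + p * np.eye(d) / d     # depolarizing noise
        rho = u.conj().T @ rho @ u                  # motion reversal
        surv.append(np.real(psi0.conj() @ rho @ psi0))
    return np.mean(surv)                            # ~ 1 - p * (1 - 1/d)

print(motion_reversal_survival())
```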

    Exploring the Benefits of Adaptive Sequential Designs in Time-to-Event Endpoint Settings

    Sequential analysis is frequently employed to address ethical and financial issues in clinical trials. Sequential analysis may be performed using standard group sequential designs or, more recently, adaptive designs that use estimates of the treatment effect to modify the maximal statistical information to be collected. In the general setting in which statistical information and clinical trial costs are functions of the number of subjects used, it has yet to be established whether there is any major efficiency advantage to adaptive designs over traditional group sequential designs. In survival analysis, however, statistical information (and hence efficiency) is most closely related to the observed number of events, while trial costs still depend on the number of patients accrued. As the number of subjects may dominate the cost of a trial, an adaptive design that specifies a reduced maximal possible sample size when an extreme treatment effect has been observed may allow early termination of accrual and therefore a more cost-efficient trial. We investigate and compare the trade-offs between efficiency (as measured by the average number of observed events required), power, and cost (a function of the number of subjects accrued and the length of observation) for standard group sequential methods and an adaptive design that allows for early termination of accrual. We find that when certain trial design parameters are constrained, an adaptive approach to terminating subject accrual may improve upon the cost efficiency of a group sequential clinical trial investigating time-to-event endpoints. However, when the spectrum of group sequential designs considered is broadened, the advantage of the adaptive designs is less clear.
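    To give a rough feel for the trade-off discussed, below is a toy simulation (all parameters are illustrative assumptions, not the paper's designs) under the Brownian-motion approximation to the log-rank score, equating statistical information with the event count. The adaptive rule shrinks the maximal number of events, a stand-in for stopping accrual early, when the interim effect estimate is extreme; setting z_adapt to infinity recovers the fixed-size comparator.

```python
import numpy as np

def simulate_trial(theta, max_events=400, interim_frac=0.5,
                   z_adapt=2.0, reduced_events=300, sims=20000, seed=1):
    """Toy comparison of a fixed-information trial and an adaptive rule
    that cuts the maximal event count when the interim effect is extreme.
    The log-rank score at information I is approximated as
    Normal(theta * I, I) with independent increments; the final test at
    1.96 ignores the multiplicity adjustment a real design would need."""
    rng = np.random.default_rng(seed)
    i1 = interim_frac * max_events                    # interim information
    events = np.empty(sims)
    reject = np.empty(sims, dtype=bool)
    for s in range(sims):
        s1 = rng.normal(theta * i1, np.sqrt(i1))      # interim score
        extreme = s1 / np.sqrt(i1) > z_adapt
        i_max = reduced_events if extreme else max_events
        di = i_max - i1                               # remaining information
        s2 = s1 + rng.normal(theta * di, np.sqrt(di)) # final score
        events[s] = i_max
        reject[s] = s2 / np.sqrt(i_max) > 1.96
    return events.mean(), reject.mean()               # (avg events, power)

print(simulate_trial(theta=0.10))                     # adaptive rule
print(simulate_trial(theta=0.10, z_adapt=np.inf))     # fixed comparator
```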